Markov process noun

  • (probability theory) Any stochastic process for which the conditional probability distribution of future states depends only on the current state (and not on past states).
марковски́й проце́сс
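
The defining property above (the next state depends only on the current state, never on the earlier path) can be illustrated with a minimal sketch of a discrete-time Markov chain. The weather states and the transition matrix `P` below are hypothetical examples, not part of the dictionary entry.

```python
import random

# Hypothetical transition matrix: P[s][t] is the probability
# of moving from state s to state t in one step.
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state. Only the current state is an input --
    no history is consulted -- which is exactly the Markov property."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding at the boundary

def simulate(start, n, seed=0):
    """Run the chain for n steps from the given start state."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

A fixed seed makes the run reproducible; each call to `step` sees only the most recent state in `path`, so conditioning on the full history would change nothing.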